Updates to make the library applicable to a wider range of use cases, and more stable. #644
base: main
Conversation
/gemini review
@csells, how does it look?
Code Review
This pull request introduces several significant and beneficial changes to make the library more flexible and robust. Key changes include removing the enforced ChatMessageRole, renaming ChatMessage to Message, and using constants for JSON keys to reduce message size. The test coverage has been significantly improved, including a new test for the example, which is excellent. My review focuses on some opportunities to improve code clarity and leverage more modern Dart features, such as using const constructors for immutable classes and refactoring switch statements/expressions for better readability. Overall, this is a solid pull request that improves the library's design and usability.
```dart
/// If there are many parts of type [TextPart], the [text] property
/// will be a concatenation of all of them.
/// Many text parts is convenient to have to support
/// streaming of the message.
```
What does "streaming of the message" mean? Standard Dart Streams are asynchronous constructs, but parts are provided as one synchronous List.
When the LLM sends a message, it adds parts one by one, so the text ends up split across several parts. A Stream is not needed here, because the message object is only created once the full message has been received. I removed this doc comment because, yes, it raises questions.
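For context, this is roughly what concatenating several text parts into one `text` value looks like. A minimal sketch with hypothetical class names, not the library's actual API:

```dart
// Hypothetical sketch: a message whose text arrived as several chunks.
// Names are assumptions for illustration, not the library's real API.
sealed class Part {
  const Part();
}

class TextPart extends Part {
  const TextPart(this.text);
  final String text;
}

class Message {
  const Message(this.parts);
  final List<Part> parts;

  /// Concatenates all [TextPart]s. Parts may have been appended one by
  /// one while the LLM response arrived, but by the time the message is
  /// constructed the list is complete, so no Stream is involved.
  String get text =>
      parts.whereType<TextPart>().map((p) => p.text).join();
}

void main() {
  const msg = Message([TextPart('Hello, '), TextPart('world!')]);
  print(msg.text); // Hello, world!
}
```

The getter is fully synchronous: the chunked shape of the data is a storage detail, not an asynchrony boundary.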
```diff
 description:
   name: characters
-sha256: f71061c654a3380576a52b451dd5532377954cf9dbd272a78fc8479606670803
+sha256: faf38497bda5ead2a8c7615f4f7939df04333478bf32e4173fcb06d428b5716b
```
Did you need to update pubspec.lock to make your code changes work? Or was this by accident?
I did not intend to, but it is an unavoidable side effect of keeping pubspec.lock in git: sometimes it changes.
My 2c on the |
We discussed the subject in the doc and agreed. See details in go/message-in-ai-primitives. Will update the PR shortly.
Contributes to #607
- Replace `dynamic` with `Object?`
- `Part` to make the class extendable
- `collection` for deep collection comparison
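The last point refers to the `package:collection` utilities for structural equality. A minimal sketch, assuming an illustrative `Message` class (not the library's actual definition), of how `DeepCollectionEquality` supports a value-based `==`:

```dart
import 'package:collection/collection.dart';

// Illustrative class, not the library's actual Message definition.
// Shows deep comparison of list contents rather than list identity.
class Message {
  const Message(this.parts);
  final List<String> parts;

  static const _eq = DeepCollectionEquality();

  @override
  bool operator ==(Object other) =>
      other is Message && _eq.equals(parts, other.parts);

  @override
  int get hashCode => _eq.hash(parts);
}

void main() {
  // Distinct list instances with equal contents compare equal.
  print(Message(const ['a', 'b']) == Message(const ['a', 'b'])); // true
  print(Message(const ['a']) == Message(const ['a', 'b'])); // false
}
```

Without the deep equality, two messages holding equal but non-identical lists would compare unequal, since `List` has identity-based `==` by default.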